2 research outputs found

    A workflow for designing stylized shading effects

    In this report, we describe a workflow for designing stylized shading effects on a 3D object, targeted at technical artists. Shading design, the process of making the illumination of an object in a 3D scene match an artist's vision, is usually a time-consuming task because of the complex interactions between materials, geometry, and the lighting environment. Physically based methods tend to provide an intuitive and coherent workflow for artists, but they are of limited use in the context of non-photorealistic shading styles. On the other hand, existing stylized shading techniques are either too specialized or require considerable hand-tuning of unintuitive parameters to give a satisfactory result. Our contribution is to separate the design process of an individual shading effect into three independent stages: control of its global behavior on the object, addition of procedural details, and colorization. Inspired by the formulation of existing shading models, we expose different shading behaviors to the artist through parametrizations that have a meaningful visual interpretation. Multiple shading effects can then be composited to obtain complex dynamic appearances. The proposed workflow is fully interactive, with real-time feedback, and allows the intuitive exploration of stylized shading effects while keeping coherence under varying viewpoints and light configurations. Furthermore, our method makes use of the deferred shading technique, making it easy to integrate into existing rendering pipelines.

    In this report, we describe a tool for creating illumination models suited to the stylization of 3D scenes. Unlike photorealistic illumination models, which follow physical constraints, stylized illumination models answer to artistic constraints, often inspired by the way light is depicted in illustration. For this reason, designing these stylized models is often complex and time-consuming. Moreover, they must produce a coherent result under a wide range of viewpoints and lighting conditions. We propose a method that eases the creation of stylized shading effects by decomposing the process into three independent parts: control of the global behavior of the illumination, addition of procedural details, and colorization. Different illumination behaviors are made accessible through parametrizations that have a visual interpretation and can be combined to obtain more complex appearances. The proposed method is interactive and allows the efficient exploration of stylized illumination models. It is implemented with the deferred shading technique, which makes it easy to use in existing rendering pipelines.
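
    As a rough illustration of the three-stage decomposition described above (a parametrization controlling global behavior, procedural detail added on top, and a final colorization step, with several effects composited together), the NumPy sketch below walks through one possible CPU-side interpretation. The function names (lambert_parametrization, add_procedural_detail, colorize, composite_over), the choice of a clamped N·L parametrization, the sine-based detail pattern, and the rim-light compositing example are all illustrative assumptions, not the report's actual workflow or API.

    import numpy as np

    def lambert_parametrization(normals, light_dir):
        """Stage 1: map each shaded point to a scalar parameter in [0, 1].

        Here the parameter is a clamped N.L term (a diffuse-like behavior);
        other parametrizations (view-dependent, curvature-based, ...) would
        slot into the same place.
        """
        n_dot_l = np.einsum("...k,k->...", normals, light_dir)
        return np.clip(n_dot_l, 0.0, 1.0)

    def add_procedural_detail(param, positions, frequency=8.0, amplitude=0.1):
        """Stage 2: perturb the parametrization with a cheap procedural pattern
        (a sine grid stands in for whatever noise or texture an artist would pick)."""
        detail = (amplitude
                  * np.sin(frequency * positions[..., 0])
                  * np.sin(frequency * positions[..., 1]))
        return np.clip(param + detail, 0.0, 1.0)

    def colorize(param, ramp):
        """Stage 3: look the parameter up in a 1D color ramp of shape (n, 3)."""
        idx = np.clip((param * (len(ramp) - 1)).astype(int), 0, len(ramp) - 1)
        return ramp[idx]

    def composite_over(base_rgb, effect_rgb, effect_alpha):
        """Blend a second shading effect over a base one ('over' compositing)."""
        a = effect_alpha[..., None]
        return a * effect_rgb + (1.0 - a) * base_rgb

    # Toy usage on two shaded points with a two-tone "toon" ramp.
    normals   = np.array([[0.0, 0.0, 1.0], [0.0, 0.7071, 0.7071]])
    positions = np.array([[0.2, 0.3, 0.0], [0.8, 0.1, 0.0]])
    light_dir = np.array([0.0, 0.0, 1.0])
    ramp = np.array([[0.2, 0.2, 0.5],   # shadow color
                     [0.2, 0.2, 0.5],
                     [0.9, 0.8, 0.6],   # lit color
                     [0.9, 0.8, 0.6]])

    p     = lambert_parametrization(normals, light_dir)
    p     = add_procedural_detail(p, positions)
    base  = colorize(p, ramp)
    rim   = np.clip(1.0 - normals[..., 2], 0.0, 1.0)       # crude view-facing term
    final = composite_over(base, np.ones_like(base), rim)  # white rim composited on top
    print(final)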

    Motion-coherent stylization with screen-space image filters

    One of the qualities sought in expressive rendering is the 2D impression of the resulting style, called flatness. In the context of 3D scenes, screen-space stylization techniques are good candidates for flatness as they operate in the 2D image plane, after the scene has been rendered into G-buffers. Various stylization filters can be applied in screen space while making use of the geometric information contained in G-buffers to ensure motion coherence. However, this means that filtering can only be done inside the rasterized surface of the object, which can be detrimental to styles that require irregular silhouettes to be convincing. In this paper, we describe a post-processing pipeline that allows stylization filters to extend outside the rasterized footprint of the object by locally “inflating” the data contained in G-buffers. This pipeline is fully implemented on the GPU and can be evaluated at interactive rates. We show how common image filtering techniques, when integrated into our pipeline and combined with G-buffer data, can reproduce a wide range of “digitally-painted” appearances, such as directed brush strokes with irregular silhouettes, while maintaining sufficient motion coherence.
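
    The central mechanism of the abstract, letting screen-space filters reach outside the rasterized footprint by locally “inflating” G-buffer data, can be sketched on the CPU. The NumPy snippet below is only an assumed illustration: inflate_gbuffer performs a naive iterative nearest-neighbor dilation of the G-buffer, and stylize_silhouette is a toy noise-jittered silhouette filter; the names, parameters, and filtering choices are hypothetical and do not reproduce the paper's GPU pipeline.

    import numpy as np

    def inflate_gbuffer(values, mask, radius):
        """Copy each covered pixel's G-buffer value into nearby uncovered pixels.

        `values` is an (H, W, C) buffer and `mask` an (H, W) boolean coverage
        mask; one 4-neighborhood dilation step per unit of `radius` (border
        wrap-around from np.roll is ignored for brevity).
        """
        values = values.copy()
        mask = mask.copy()
        for _ in range(radius):
            grown = mask.copy()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                shifted_mask = np.roll(mask, (dy, dx), axis=(0, 1))
                shifted_vals = np.roll(values, (dy, dx), axis=(0, 1))
                fill = shifted_mask & ~grown        # uncovered pixels with a covered neighbor
                values[fill] = shifted_vals[fill]
                grown |= fill
            mask = grown
        return values, mask

    def stylize_silhouette(color, inflated_mask, rng, noise_scale=0.35):
        """Toy stylization filter: jitter the silhouette of the inflated mask with
        noise, producing an irregular edge that extends past the original footprint."""
        noise = rng.random(inflated_mask.shape)
        keep = inflated_mask & (noise > noise_scale)
        out = np.zeros_like(color)
        out[keep] = color[keep]
        return out

    # Toy usage: a small disc rasterized into a 64x64 G-buffer.
    H = W = 64
    yy, xx = np.mgrid[0:H, 0:W]
    mask = (yy - 32) ** 2 + (xx - 32) ** 2 < 18 ** 2   # rasterized coverage
    color = np.zeros((H, W, 3))
    color[mask] = (0.8, 0.4, 0.2)                      # flat base color inside the object

    inflated_color, inflated_mask = inflate_gbuffer(color, mask, radius=4)
    stylized = stylize_silhouette(inflated_color, inflated_mask, np.random.default_rng(0))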